# Multilingual Generation
## Aya 23 8B
Aya 23 8B is an open-weights research release of an instruction fine-tuned model with highly advanced multilingual capabilities, supporting 23 languages.
Large Language Model · Transformers · Supports Multiple Languages

Published by CohereLabs
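Since the card tags this as a Transformers checkpoint, a minimal generation sketch follows. The Hub id `CohereLabs/aya-23-8B` is an assumption inferred from the publisher and model name, not stated in the listing; because Aya 23 is instruction-tuned, the prompt is passed through the tokenizer's chat template.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereLabs/aya-23-8B"  # assumed Hub id; verify against the catalog entry

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Instruction-tuned model: format the request with the chat template.
messages = [{"role": "user", "content": "Translate to Turkish: How are you today?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=100)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```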
## GLM 4 32B Base 0414
License: MIT
GLM-4-32B-Base-0414 is a 32-billion-parameter large language model pre-trained on 15T tokens of high-quality data. It supports both Chinese and English and excels at tasks such as code generation and function calling.
Large Language Model · Transformers · Supports Multiple Languages

Published by THUDM
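As a base (pre-trained, not instruction-tuned) checkpoint, this model continues plain text rather than following chat-formatted prompts. The sketch below shows code completion, matching the card's code-generation claim; the Hub id `THUDM/GLM-4-32B-Base-0414` is assumed from the entry name.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/GLM-4-32B-Base-0414"  # assumed Hub id; verify against the catalog entry

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Base model: give it a plain-text prefix to continue. A Chinese prompt
# works equally well, since the model supports both Chinese and English.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```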
## Mistral Nemo Base 2407
License: Apache-2.0
Mistral-Nemo-Base-2407 is a 12-billion-parameter generative text pre-trained model trained jointly by Mistral AI and NVIDIA; it outperforms existing models of similar or smaller size.
Large Language Model · Transformers · Supports Multiple Languages

Published by mistralai
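This is likewise a base model, so a minimal sketch uses plain text completion; a non-English prefix fits the multilingual theme of this section. The Hub id `mistralai/Mistral-Nemo-Base-2407` is assumed from the publisher and entry name.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"  # assumed Hub id; verify against the catalog entry

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Base model: it continues text rather than answering instructions.
prompt = "La capitale de la France est"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```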